Analysis of co-articulation regions for performance-driven facial animation
Authors
Abstract
A facial gesture analysis procedure is presented for the control of animated faces. Facial images are partitioned into a set of local, independently actuated regions of appearance change termed co-articulation regions (CRs). Each CR is parameterized by the activation level of a set of face gestures that affect the region. The activation of a CR is analyzed using independent component analysis (ICA) on a set of training images acquired from an actor. Gesture intensity classification is performed in ICA space by correlation to training samples. Correlation in ICA space proves to be an efficient and stable method for gesture intensity classification with limited training data. A discrete sample-based synthesis method is also presented. An artist creates an actor-independent reconstruction sample database that is indexed with CR state information analyzed in real time from video. Copyright © 2004 John Wiley & Sons, Ltd.
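The pipeline described in the abstract (ICA fitted to per-region training images, then gesture-intensity classification by correlation in ICA space) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the training patches and labels, the helper names, and the use of scikit-learn's FastICA as the ICA implementation are all assumptions.

```python
# Minimal sketch: ICA basis per co-articulation region (CR), then gesture
# intensity classification by correlation to training samples in ICA space.
# Inputs (train_patches, train_labels, test_patch) are assumed to exist.
import numpy as np
from sklearn.decomposition import FastICA

def fit_cr_model(train_patches, n_components=8):
    """Fit an ICA basis for one CR.

    train_patches: (n_samples, n_pixels) flattened CR images captured from
    the actor at known gesture intensity levels.
    """
    ica = FastICA(n_components=n_components, random_state=0)
    train_codes = ica.fit_transform(train_patches)   # per-sample ICA coefficients
    return ica, train_codes

def classify_gesture_intensity(ica, train_codes, train_labels, test_patch):
    """Label a new CR image by correlation to training samples in ICA space."""
    code = ica.transform(test_patch.reshape(1, -1))[0]
    # Normalized correlation between the test code and every training code.
    c = code - code.mean()
    t = train_codes - train_codes.mean(axis=1, keepdims=True)
    corr = (t @ c) / (np.linalg.norm(t, axis=1) * np.linalg.norm(c) + 1e-12)
    return train_labels[int(np.argmax(corr))]
```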
Similar references
Improving naturalness of visual speech synthesis
Facial animation has progressed significantly over the past few years and a variety of algorithms and techniques now make it possible to create highly realistic characters. Based on the author’s visual feature database for speechreading and the development of 3D modelling, a Hungarian talking head has been created. Our general approach is to use both static and dynamic observations of natural s...
Realistic 3D facial animation parameters from mirror-reflected multi-view video
In this paper, a robust, accurate and inexpensive approach to estimate 3D facial motion from multi-view video is proposed, where two mirrors located near one’s cheeks can reflect the side views of markers on one’s face. Nice properties of mirrored images are utilized to simplify the proposed tracking algorithm significantly, while a Kalman filter is employed to reduce the noise and to predict t...
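The Kalman filtering step mentioned in this snippet is standard; a minimal constant-velocity filter for one tracked facial marker might look like the sketch below. The state layout, time step, and noise levels are illustrative assumptions, not values from that paper.

```python
# Constant-velocity Kalman filter for smoothing and predicting one 3D marker.
import numpy as np

class MarkerKalman:
    def __init__(self, dt=1 / 30, process_var=1e-2, meas_var=1e-1):
        # State: [x, y, z, vx, vy, vz]; measurement: [x, y, z].
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                 # constant-velocity motion model
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = process_var * np.eye(6)                # process noise covariance
        self.R = meas_var * np.eye(3)                   # measurement noise covariance
        self.x = np.zeros(6)
        self.P = np.eye(6)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]                               # predicted marker position

    def update(self, z):
        y = z - self.H @ self.x                         # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                               # filtered marker position
```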
Towards Automatic Performance Driven Animation Between Multiple Types of Facial Model
In this paper we describe a method for re-mapping animation parameters between multiple types of facial model for performance driven animation. A facial performance can be analysed in terms of a set of facial action parameter trajectories using a modified appearance model with modes of variation encoding specific facial actions which we can pre-define.
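As far as the truncated abstract describes it, the re-mapping idea amounts to coding a performance as activations of pre-defined facial-action modes in a source appearance model and replaying those activations through the target model's corresponding modes. A hedged linear-algebra sketch, with every variable name assumed rather than taken from that paper:

```python
# Sketch: encode a frame as facial-action mode activations, then drive a
# different facial model with the same activations.
import numpy as np

def encode_actions(frame, src_mean, src_modes):
    """Least-squares activation of each pre-defined source action mode."""
    params, *_ = np.linalg.lstsq(src_modes.T, frame - src_mean, rcond=None)
    return params                                   # one value per facial action

def remap_to_target(params, tgt_mean, tgt_modes):
    """Replay the action activations through the target model's modes."""
    return tgt_mean + tgt_modes.T @ params
```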
Online Expression Mapping for Performance-Driven Facial Animation
Recently, performance-driven facial animation has become popular in various entertainment areas, such as games, animated films, and advertising. With motion capture data easily obtained from a performer's face, the resulting animated faces are far more natural and lifelike. However, when the characteristic features of the live performer and the animated character are quite different, expression map...
Chapter 8 FACIAL MOTION CAPTURING USING AN EXPLANATION-BASED APPROACH
Building deformation models using the motions captured from real video sequences is becoming a popular method in facial animation. In this paper, we propose an explanation-based facial motion tracking algorithm based on a piecewise Bézier volume deformation model (PBVD). The PBVD is a suitable model both for the synthesis and the analysis of facial images. It is linear and independent of the fa...
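In a PBVD-style model, a single Bézier volume displaces a surface point by a Bernstein-weighted sum of control-point displacements, which keeps the deformation linear in the controls. The sketch below illustrates only that idea; the degrees, control layout, and function names are assumptions, not that paper's formulation.

```python
# Sketch of one Bézier volume: displacement of a point at parametric
# coordinates (u, v, w) as a Bernstein-weighted sum of control displacements.
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * (t ** i) * ((1.0 - t) ** (n - i))

def bezier_volume_displacement(ctrl_disp, u, v, w):
    """Displacement of a point at (u, v, w) in [0, 1]^3.

    ctrl_disp: (n+1, m+1, l+1, 3) control-point displacement vectors; the
    resulting field is linear in these controls, which is what makes the
    model convenient for both synthesis and analysis.
    """
    n, m, l = (s - 1 for s in ctrl_disp.shape[:3])
    d = np.zeros(3)
    for i in range(n + 1):
        for j in range(m + 1):
            for k in range(l + 1):
                d += (bernstein(n, i, u) * bernstein(m, j, v)
                      * bernstein(l, k, w)) * ctrl_disp[i, j, k]
    return d
```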
Journal: Journal of Visualization and Computer Animation
Volume 15, Issue -
Pages -
Publication date: 2004